Search for: All records; Creators/Authors contains: "Walsh, Cole"


  1. Pamucar, Dragan (Ed.)
    Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess it as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of whether the individual evaluation questions are included. This research offers new insight into the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.
  2. Abstract

    To further explore the effect of weighted arms on toddlers’ performance in problem solving (Arterberry et al., 2018, Infancy, 23(2), 173), the present study explored scale errors and categorization, two instances where infants appear to show more advanced knowledge than toddlers. Experiment 1 (N = 67) used a novel task for inducing scale errors among 24‐ to 29‐month‐olds. Results replicated the rates of scale errors found in previous research that used different tasks. Experiment 2 used sequential touching (N = 31) and sorting measures (N = 23) to test categorization in 24‐month‐old children. In both measures, children showed categorization at the basic level when there was high contrast between the exemplars, but not at the basic level with low contrast or at the subordinate level. In Experiments 1 and 2, half the participants were tested while wearing weighted wristbands. Weighting the arms did not affect error rates, in contrast to previous research showing that weights improved performance in search tasks. The findings are discussed in light of children's difficulty in integrating perception, cognition, and action.

     
  3. Abstract

    The newly developed Four‐Dimensional Ecology Education (4DEE) framework, produced by the Ecological Society of America, provides updated guidance for undergraduate instruction. To help instructors align their courses to this framework and assess student progress toward its goals, we have recoded the comprehensive programmatic assessment Ecology and Evolution‐Measuring Achievement and Progression in Science (EcoEvo‐MAPS) and reanalyzed a national dataset of over 2000 undergraduate student responses. Here, we show how the EcoEvo‐MAPS questions align to the 4DEE framework and provide student performance data across the dimensions and elements. We also include information from student interviews to help inspire educators to develop new lessons, additional assessment questions, and other course materials in these areas. Finally, we provide information on a new web‐based portal that allows instructors to easily administer EcoEvo‐MAPS to students and receive an automatically generated score report that aligns results to the 4DEE framework.

     